
Rows vs Columns for Linear Systems of Equations - Randomized Kaczmarz or Coordinate Descent?


Abstract

This paper is about randomized iterative algorithms for solving a linear system of equations $X \beta = y$ in different settings. Recent interest in the topic was reignited when Strohmer and Vershynin (2009) proved the linear convergence rate of a Randomized Kaczmarz (RK) algorithm that works on the rows of $X$ (data points). Following that, Leventhal and Lewis (2010) proved the linear convergence of a Randomized Coordinate Descent (RCD) algorithm that works on the columns of $X$ (features). The aim of this paper is to simplify our understanding of these two algorithms, establish the direct relationships between them (though RK is often compared to Stochastic Gradient Descent), and examine the algorithmic commonalities or tradeoffs involved with working on rows or columns. We also discuss Kernel Ridge Regression and present a Kaczmarz-style algorithm that works on data points and has the advantage of solving the problem without ever storing or forming the Gram matrix, one of the recognized problems encountered when scaling kernelized methods.
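As a point of reference (a minimal sketch, not the paper's own formulation), the classical Randomized Kaczmarz update of Strohmer and Vershynin projects the current iterate onto the solution hyperplane of a randomly chosen row of $X$, while the Randomized Coordinate Descent step of Leventhal and Lewis exactly minimizes the least-squares objective along a randomly chosen column. The NumPy sketch below assumes rows and columns are sampled with probability proportional to their squared norms, as in those two papers; the function names and default arguments are illustrative.

import numpy as np

def randomized_kaczmarz(X, y, n_iter=10000, seed=0):
    # Row-based: project the iterate onto the hyperplane {b : x_i^T b = y_i}
    # for a row i sampled with probability proportional to ||x_i||^2.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    probs = np.sum(X**2, axis=1) / np.sum(X**2)
    beta = np.zeros(d)
    for _ in range(n_iter):
        i = rng.choice(n, p=probs)
        x_i = X[i]
        beta += (y[i] - x_i @ beta) / (x_i @ x_i) * x_i
    return beta

def randomized_coordinate_descent(X, y, n_iter=10000, seed=0):
    # Column-based: exact minimization of ||X beta - y||^2 along coordinate j,
    # with j sampled with probability proportional to ||X[:, j]||^2.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    probs = np.sum(X**2, axis=0) / np.sum(X**2)
    beta = np.zeros(d)
    residual = X @ beta - y          # maintained incrementally
    for _ in range(n_iter):
        j = rng.choice(d, p=probs)
        col = X[:, j]
        step = -(col @ residual) / (col @ col)
        beta[j] += step
        residual += step * col
    return beta

For a consistent system with full column rank, both iterates converge linearly in expectation to the solution $\beta$; the contrast between the two updates is the row-versus-column tradeoff the abstract refers to.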

Bibliographic Information

  • Author

    Ramdas, Aaditya;

  • Affiliation
  • Year: 2014
  • Total Pages
  • Format: PDF
  • Language: English
  • Chinese Library Classification
